Sparse Bayes Inference by Annealing Entropy
Authors
Abstract
We present a novel Bayesian computation method for finding sparse estimates in a range of statistical models. Sparsity identification typically requires substantial effort to solve a hard combinatorial optimization over a large configuration space of sparsity patterns. For many existing methods, the presence of numerous poor local optima remains a difficulty yet to be overcome. The essence of our approach is to optimize the augmented posterior distribution, i.e., to find the maximum a posteriori (MAP) estimate, jointly with respect to sparsity configurations and model parameters. To realize efficient MAP computation, we impose an artificial regularizer on the posterior entropy of sparsity configurations, where the degree of regularization is dynamically controlled by a meta-parameter called the temperature. Our algorithm prescribes a cooling schedule under which the temperature decays slowly toward zero, and proceeds with iterative optimization over sparsity configurations and parameters while following that schedule. In the zero-temperature limit, the method yields an exact MAP estimate, which is expected to be the global or a near-optimal posterior mode. We detail the procedure in particular for a latent factor model, and discuss some intrinsic properties of the annealing.
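One natural reading of the procedure described above is the maximization of an entropy-regularized objective F_T(q, θ) = E_q[log p(y, γ, θ)] + T·H(q), where q is a distribution over the sparsity indicators γ, H(q) is its entropy, and the temperature T is slowly lowered to zero so that q hardens into a single sparsity configuration. The following minimal Python sketch illustrates this idea on a toy sparse-mean model of our own devising; the function name annealed_sparse_map, the model, and all parameter values are illustrative assumptions, not the authors' procedure (which is developed for a latent factor model).

```python
import numpy as np

def annealed_sparse_map(y, sigma2=4.0, pi=0.2, T0=1.0, Tmin=1e-3,
                        cooling=0.9, inner_iters=20):
    """Illustrative entropy-annealed MAP for a toy sparse-mean model:
    y_j ~ N(gamma_j * mu_j, 1), mu_j ~ N(0, sigma2), gamma_j ~ Bern(pi).

    Maintains soft inclusion probabilities r_j (an independent Bernoulli q)
    and maximizes the entropy-regularized objective
        F_T = E_q[log p(y, gamma, mu)] + T * H(q),
    lowering T so that q hardens into a single sparsity configuration."""
    r = np.full_like(y, 0.5)                 # soft sparsity indicators
    logit_pi = np.log(pi / (1.0 - pi))
    T = T0
    while T > Tmin:
        for _ in range(inner_iters):
            # Update of the continuous parameters given soft indicators
            mu = r * y / (r + 1.0 / sigma2)
            # Expected log-posterior gain of gamma_j = 1 over gamma_j = 0
            delta = 0.5 * y**2 - 0.5 * (y - mu)**2 + logit_pi
            # Tempered responsibility update: stationary point of F_T in r
            r = 1.0 / (1.0 + np.exp(-delta / T))
        T *= cooling                         # slow geometric cooling schedule
    gamma = (r > 0.5).astype(int)            # T -> 0 limit: hard configuration
    mu = gamma * y * sigma2 / (sigma2 + 1.0)
    return gamma, mu

# Toy usage on synthetic data with a 20% sparse signal.
rng = np.random.default_rng(0)
true_gamma = rng.random(50) < 0.2
y = true_gamma * rng.normal(0.0, 2.0, 50) + rng.normal(0.0, 1.0, 50)
gamma_hat, mu_hat = annealed_sparse_map(y)
print("estimated support:", np.flatnonzero(gamma_hat))
```

At high T the responsibilities r_j stay soft and the optimization can move across many sparsity configurations; as T falls, the sigmoid update sharpens and the solution commits to a single pattern, mimicking the cooling schedule described in the abstract.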
Similar resources
Quantum Annealing for Variational Bayes Inference
This paper presents a deterministic annealing algorithm based on quantum annealing for variational Bayes (QAVB) inference, which can be seen as an extension of simulated annealing for variational Bayes (SAVB) inference. QAVB is as easy to implement as SAVB. Experiments revealed that QAVB finds a better local optimum than SAVB in terms of the variational free energy in latent Dirichlet...
Full text
Estimation for the Type-II Extreme Value Distribution Based on Progressive Type-II Censoring
In this paper, we discuss statistical inference on the unknown parameters and the reliability function of the type-II extreme value (EVII) distribution when the observed data are progressively type-II censored. By applying the EM algorithm, we obtain maximum likelihood estimates (MLEs). We also suggest approximate maximum likelihood estimators (AMLEs), which have explicit expressions. We provide Bayes ...
Full text
Updating Probabilities
The Method of Maximum (relative) Entropy (ME) has been designed for updating from a prior distribution to a posterior distribution when the information being processed is in the form of a constraint on the family of allowed posteriors. This is in contrast with the usual MaxEnt, which was designed as a method to assign, and not to update, probabilities. The objective of this paper is to strengthe...
Full text
Inference algorithms and learning theory for Bayesian sparse factor analysis
Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as wel...
Full text
A Simulated Annealing Approach to Bayesian Inference
A generic algorithm for the extraction of probabilistic (Bayesian) information about model parameters from data is presented. The algorithm propagates an ensemble of particles in the product space of model parameters and outputs. Each particle update consists of a random jump in parameter space followed by a simulation of a model output and a Metropolis acceptance/rejection step based on a comp... (a minimal sketch of this particle update appears after this list)
Full text
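The snippet above is truncated, so the following is only an illustrative reading of the particle update it describes (random jump, model simulation, Metropolis acceptance/rejection), written as a minimal Python sketch. The function name sa_bayes_ensemble, the cooling schedule, and all defaults are our assumptions, not the authors'.

```python
import numpy as np

def sa_bayes_ensemble(data, simulate, log_prior, log_like,
                      n_particles=200, n_steps=500, step=0.3,
                      T0=5.0, Tmin=1.0, seed=0):
    """Illustrative ensemble scheme: each particle takes a random jump in
    parameter space, a model output is simulated for the proposal, and a
    Metropolis acceptance/rejection step keeps or discards the jump.
    Cooling from T0 down to Tmin = 1 anneals toward the untempered posterior."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, 1.0, (n_particles, 1))   # ensemble of particles
    logp = np.array([log_prior(t) + log_like(simulate(t), data) for t in theta])
    for k in range(n_steps):
        T = T0 * (Tmin / T0) ** (k / (n_steps - 1))  # geometric cooling
        prop = theta + step * rng.normal(size=theta.shape)  # random jump
        logp_prop = np.array([log_prior(t) + log_like(simulate(t), data)
                              for t in prop])
        # Metropolis acceptance/rejection at temperature T
        accept = np.log(rng.random(n_particles)) < (logp_prop - logp) / T
        theta[accept] = prop[accept]
        logp[accept] = logp_prop[accept]
    return theta  # approximate posterior ensemble

# Toy usage: posterior of a scalar mean under a standard normal prior.
samples = sa_bayes_ensemble(
    data=2.0,
    simulate=lambda t: t,                            # identity "model"
    log_prior=lambda t: -0.5 * float(t @ t),
    log_like=lambda out, d: -0.5 * float((d - out) @ (d - out)))
print("posterior mean ~", samples.mean())
```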